Reasoning with Data Flows and Policy Propagation Rules
Data-oriented systems and applications are at the centre of current developments of the World Wide Web. In these scenarios, assessing what policies propagate from the licenses of data sources to the output of a given data-intensive system is an important problem. Both policies and data flows can be described with Semantic Web languages. Although it is possible to define Policy Propagation Rules (PPR) by associating policies with data flow steps, this activity results in a huge number of rules to be stored and managed. In a recent paper, we introduced strategies for reducing the size of a PPR knowledge base by using an ontology of the possible relations between data objects, the Datanode ontology, and applying the (A)AAAA methodology, a knowledge engineering approach that exploits Formal Concept Analysis (FCA). In this article, we investigate whether this reasoning is feasible and how it can be performed. For this purpose, we study the impact of compressing a rule base associated with an inference mechanism on the performance of the reasoning process. Moreover, we report on an extension of the (A)AAAA methodology that includes a coherency check algorithm that makes this reasoning possible. We show how this compression, in addition to being beneficial to the management of the knowledge base, also has a positive impact on the performance and resource requirements of the reasoning process for policy propagation.
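The core idea of Policy Propagation Rules can be illustrated with a minimal sketch: a rule states that a given policy survives a given data-flow step. The policy and relation names below are invented for illustration (the relation names only loosely echo the Datanode ontology), and the rule base is a toy one, not the paper's actual knowledge base.

```python
# A PPR is a pair (policy, data-flow relation) stating that the policy
# propagates across that kind of step. All names here are illustrative.
PPR = {
    ("share-alike", "processedInto"),
    ("share-alike", "copy"),
    ("attribution", "copy"),
}

def propagate(policies, relation, rules=PPR):
    """Return the subset of policies that survive one data-flow step."""
    return {p for p in policies if (p, relation) in rules}

source_policies = {"share-alike", "attribution"}
print(propagate(source_policies, "processedInto"))  # → {'share-alike'}
```

Compressing the rule base then amounts to replacing many such (policy, relation) pairs with fewer, more abstract rules over groupings of relations, which is where FCA comes in.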
Linked Data for the Humanities: methods and techniques
So far, the impact of Linked Data in the Library and Cultural Heritage domain has been significant, as attested by large-scale efforts such as that of Europeana. However, at a closer look, the impact of Semantic Web research on the Humanities has been discontinuous. Foundational techniques and methods developed by the SW community are still perceived as esoteric by many DH practitioners. In addition, more recent approaches have not yet been disseminated in the DH community. We propose a half-day tutorial on LD methods and techniques, to present the theoretical and technical foundations of Linked Data, to provide a reference collection of reusable tools to boost an effective adoption of LD in DH projects, and to showcase a set of innovative methods for extracting and linking data from texts.
Clinical guidelines as plans: An ontological theory
Clinical guidelines are special types of plans realized by collective agents. We provide an ontological theory of such plans that is designed to support the construction of a framework in which guideline-based information systems can be employed in the management of workflow in health care organizations.
The framework we propose allows us to represent in formal terms how clinical guidelines are realized through the actions of individuals organized into teams. We provide various levels of implementation representing different levels of conformity on the part of health care organizations.
Implementations built in conformity with our framework are marked by two dimensions of flexibility that are designed to make them more likely to be accepted by health care professionals than standard guideline-based management systems. They do justice to the fact 1) that responsibilities within a health care organization are widely shared, and 2) that health care professionals may on different occasions be non-compliant with guidelines for a variety of well justified reasons.
The advantage of the framework lies in its built-in flexibility, its sensitivity to clinical context, and its ability to use inference tools based on a robust ontology. One disadvantage lies in its complicated implementation.
Amnestic Forgery: an Ontology of Conceptual Metaphors
This paper presents Amnestic Forgery, an ontology for metaphor semantics, based on MetaNet, which is inspired by the theory of Conceptual Metaphor. Amnestic Forgery reuses and extends the Framester schema as an ideal ontology design framework to deal with both semiotic and referential aspects of frames, roles, mappings, and eventually blending. The description of the resource is accompanied by a discussion of its applications, with examples taken from metaphor generation and the referential problems of metaphoric mappings. Both schema and data are available from the Framester SPARQL endpoint.
Linked Metaphors
The poster summarizes Amnestic Forgery, an ontology for metaphor semantics, based on MetaNet and Framester factual-linguistic linked data. An example of metaphor generation based on linked metaphors is shown.
Early analysis and debugging of linked open data cubes
The release of the Data Cube Vocabulary specification introduces a standardised method for publishing statistics following the linked data principles. However, a statistical dataset can be very complex, and so understanding how to get value out of it may be hard. Analysts need the ability to quickly grasp the content of the data to be able to make use of it appropriately. In addition, while remodelling the data, data cube publishers need support to detect bugs and issues in the structure or content of the dataset. There are several aspects of RDF, the Data Cube vocabulary and linked data that can help with these issues. One of the features of an RDF dataset is to be "self-descriptive". Here, we attempt to answer the question "How feasible is it to use this feature to give an overview of the data in a way that would facilitate debugging and exploration of statistical linked open data?" We present a tool that automatically builds interactive facets as diagrams out of a Data Cube representation without prior knowledge of the data content, to be used for debugging and early analysis. We show how this tool can be used on a large, complex dataset and we discuss the potential of this approach.
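The "self-descriptive" property the abstract relies on can be sketched concretely: a Data Cube's structure definition (DSD) is itself RDF, so a tool can discover the dataset's dimensions from the data alone. The sketch below uses plain tuples as triples and invented `ex:` identifiers to stay dependency-free; a real tool would query an RDF store with SPARQL.

```python
# The qb: namespace is the real Data Cube vocabulary; the ex: resources
# are invented for this sketch.
QB = "http://purl.org/linked-data/cube#"

triples = [
    ("ex:dsd", QB + "component", "ex:c1"),
    ("ex:c1", QB + "dimension", "ex:refArea"),
    ("ex:dsd", QB + "component", "ex:c2"),
    ("ex:c2", QB + "dimension", "ex:refPeriod"),
    ("ex:dsd", QB + "component", "ex:c3"),
    ("ex:c3", QB + "measure", "ex:population"),
]

def dimensions(dsd, triples):
    """Read the dimensions of a cube straight from its own DSD triples."""
    comps = [o for s, p, o in triples if s == dsd and p == QB + "component"]
    return [o for s, p, o in triples
            if s in comps and p == QB + "dimension"]

print(dimensions("ex:dsd", triples))  # → ['ex:refArea', 'ex:refPeriod']
```

Each discovered dimension can then seed one interactive facet, with no prior knowledge of the dataset's content.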
Bottom-Up Ontology Construction with Contento
In this demo paper we show an approach to building Semantic Web ontologies from sample linked data with a tool named Contento. Contento is a data-driven ontology construction kit, based on Formal Concept Analysis (FCA). We show the exploration and analysis functionalities of Contento, as well as the method to generate, annotate and prune concept hierarchies. Moreover, we describe a procedure to go from sample data, extracted from SPARQL endpoints, to a new OWL ontology.
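The FCA step at the heart of such a tool can be sketched in a few lines: given an object-attribute context (here, invented RDF-ish entities and their types/properties), enumerate the formal concepts, i.e. maximal (extent, intent) pairs, by closing every attribute subset. This brute-force enumeration is only illustrative; Contento's actual algorithms and data are not shown here.

```python
from itertools import combinations

# Toy formal context: objects (linked-data entities) and the
# attributes (types/properties) they exhibit. All names are invented.
context = {
    "alice": {"Person", "hasName"},
    "acme":  {"Organization", "hasName"},
    "bob":   {"Person", "hasName", "hasEmail"},
}
attrs = set().union(*context.values())

def extent(intent_set):
    """Objects having every attribute in intent_set."""
    return {o for o, a in context.items() if intent_set <= a}

def intent(ext):
    """Attributes shared by every object in ext."""
    if not ext:
        return set(attrs)
    return set.intersection(*(context[o] for o in ext))

# Close every attribute subset to obtain the concept lattice's nodes.
concepts = set()
for r in range(len(attrs) + 1):
    for subset in combinations(sorted(attrs), r):
        e = extent(set(subset))
        concepts.add((frozenset(e), frozenset(intent(e))))

print(len(concepts), "formal concepts")  # → 5 formal concepts
```

Ordered by extent inclusion, these concepts form the hierarchy that a tool like Contento lets the user annotate and prune into class candidates.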
Experiments on real-life emotions challenge Ekman's model
Ekman's emotions (1992) are defined as universal basic emotions. Over the years, alternative models have emerged (e.g. Greene and Haidt 2002; Barrett 2017) describing emotions as social and linguistic constructions. The variety of models existing today raises the question of whether the abstraction provided by such models is sufficient as a descriptive/predictive tool for representing real-life emotional situations. Our study presents a social inquiry to test whether traditional models are sufficient to capture the complexity of daily life emotions, reported in a textual context. The intent of the study is to establish the human-subject agreement rate in an annotated corpus based on Ekman's theory (Entity-Level Tweets Emotional Analysis) and the human-subject agreement rate when using Ekman's emotions to annotate sentences that don't fit Ekman's model (The Dictionary of Obscure Sorrows). Furthermore, we investigated how much alexithymia can influence the human ability to detect and categorise emotions. On a total sample of 114 subjects, our results show low within-subjects agreement rates for both datasets, particularly for subjects with low levels of alexithymia; low levels of agreement with the original annotations; and frequent use of emotions based on Ekman's model, particularly negative ones, in people with high levels of alexithymia.
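One common way to quantify the agreement rates the study reports is simple pairwise agreement per item: the fraction of annotator pairs that chose the same label. The sketch below uses invented labels over Ekman's six basic emotions, not the study's data, and ignores chance correction (a real analysis would likely use a kappa statistic).

```python
from collections import Counter

# Invented annotations: three annotators per item, Ekman-style labels.
annotations = {
    "item-1": ["joy", "joy", "surprise"],
    "item-2": ["anger", "anger", "anger"],
    "item-3": ["sadness", "fear", "sadness"],
}

def pairwise_agreement(labels):
    """Fraction of annotator pairs assigning the same label."""
    n = len(labels)
    pairs = n * (n - 1) / 2
    agree = sum(c * (c - 1) / 2 for c in Counter(labels).values())
    return agree / pairs

rates = {item: pairwise_agreement(ls) for item, ls in annotations.items()}
print(rates["item-2"])  # → 1.0
```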
Semantic Role Labeling for Knowledge Graph Extraction from Text
This paper introduces TakeFive, a new semantic role labeling method that transforms a text into a frame-oriented knowledge graph. It performs dependency parsing, identifies the words that evoke lexical frames, locates the roles and fillers for each frame, runs coercion techniques, and formalizes the results as a knowledge graph. This formal representation complies with the frame semantics used in Framester, a factual-linguistic linked data resource. We tested our method on the WSJ section of the Penn Treebank annotated with VerbNet and PropBank labels and on the Brown corpus. The evaluation has been performed according to the CoNLL Shared Task on Joint Parsing of Syntactic and Semantic Dependencies. The obtained precision, recall, and F1 values indicate that TakeFive is competitive with other existing methods such as SEMAFOR, Pikes, PathLSTM, and FRED. We finally discuss how to combine TakeFive and FRED, obtaining higher values of precision, recall, and F1 measure.
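The pipeline's final formalization step can be sketched as follows: once a frame occurrence and its role fillers are identified, they become knowledge-graph triples. The frame, role, and entity names below are invented examples in the spirit of frame semantics, not actual Framester identifiers or TakeFive's internal representation.

```python
def to_triples(frame_instance):
    """Formalize one frame occurrence as a list of RDF-style triples."""
    frame, occurrence, roles = frame_instance
    triples = [(occurrence, "rdf:type", frame)]
    triples += [(occurrence, role, filler) for role, filler in roles.items()]
    return triples

# "Mary sold the car": a commerce-sell frame with two role fillers.
sold = ("frame:CommerceSell", "ex:event1",
        {"frame:Seller": "ex:Mary", "frame:Goods": "ex:car"})
for t in to_triples(sold):
    print(t)
```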